Least Angle Regression

Authors

  • Bradley Efron
  • Trevor Hastie
  • Iain Johnstone
Abstract

The purpose of model selection algorithms such as All Subsets, Forward Selection, and Backward Elimination is to choose a linear model on the basis of the same set of data to which the model will be applied. Typically we have available a large collection of possible covariates from which we hope to select a parsimonious set for the efficient prediction of a response variable. Least Angle Regression (“LARS”), a new model selection algorithm, is a useful and less greedy version of traditional forward selection methods. Three main properties are derived. (1) A simple modification of the LARS algorithm implements the Lasso, an attractive version of Ordinary Least Squares that constrains the sum of the absolute regression coefficients; the LARS modification calculates all possible Lasso estimates for a given problem, using an order of magnitude less computer time than previous methods. (2) A different LARS modification efficiently implements Forward Stagewise linear regression, another promising new model selection method; this connection explains the similar numerical results previously observed for the Lasso and Stagewise, and helps us understand the properties of both methods, which are seen as constrained versions of the simpler LARS algorithm. (3) A simple approximation for the degrees of freedom of a LARS estimate is available, from which we derive a Cp estimate of prediction error; this allows a principled choice among the range of possible LARS estimates. LARS and its variants are computationally efficient: the paper describes a publicly available algorithm that requires only the same order of magnitude of computational effort as Ordinary Least Squares applied to the full set of covariates.
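The Forward Stagewise procedure mentioned in the abstract admits a very short sketch: repeatedly find the covariate most correlated with the current residual and nudge its coefficient by a small step in that direction. The pure-Python implementation below is illustrative only; the toy data, step size, and iteration count are assumptions, not from the paper.

```python
def forward_stagewise(X, y, eps=0.01, steps=5000):
    """Forward Stagewise linear regression (minimal sketch).

    Each step moves the coefficient of the covariate most correlated
    with the current residual by +/- eps. As eps -> 0, the resulting
    coefficient path is closely related to the Lasso path, which is
    the connection the paper makes precise via LARS.
    """
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(steps):
        # Current residual r = y - X @ beta
        r = [y[i] - sum(X[i][j] * beta[j] for j in range(p)) for i in range(n)]
        # Inner products c_j = <x_j, r>; pick the most correlated covariate
        c = [sum(X[i][j] * r[i] for i in range(n)) for j in range(p)]
        j = max(range(p), key=lambda k: abs(c[k]))
        # Tiny step in the sign of that correlation
        beta[j] += eps if c[j] > 0 else -eps
    return beta

# Hypothetical toy problem: y = 2*x0 - 1*x1 exactly, no noise.
X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
y = [2.0, -1.0, 1.0]
beta = forward_stagewise(X, y)  # approaches [2, -1] to within ~eps
```

Because the fit is exact here, the coefficients settle near the least-squares solution, oscillating within roughly one step size of it; the paper's LARS formulation replaces these many tiny steps with a small number of exactly computed ones.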

Related articles

Least angle and l1 penalized regression: A review

Least Angle Regression is a promising technique for variable selection applications, offering a nice alternative to stepwise regression. It provides an explanation for the similar behavior of LASSO (l1-penalized regression) and forward stagewise regression, and provides a fast implementation of both. The idea has caught on rapidly, and sparked a great deal of research interest. In this paper, w...

Least Angle and L1 Regression: A Review

Least Angle Regression is a promising technique for variable selection applications, offering a nice alternative to stepwise regression. It provides an explanation for the similar behavior of LASSO (L1-penalized regression) and forward stagewise regression, and provides a fast implementation of both. The idea has caught on rapidly, and sparked a great deal of research interest. In this paper, w...

Summary and discussion of: “Exact Post-selection Inference for Forward Stepwise and Least Angle Regression”

In this report we summarize the recent paper [Taylor et al., 2014] which proposes new inference tools for methods that perform variable selection and estimation in an adaptive regression. Although this paper mainly studies forward stepwise regression (FS) and least angle regression (LAR), the approach in this paper is not limited to these cases. This paper describes how to carry out exact infer...

Band Selection Method for Retrieving Soil Lead Content with Hyperspectral Remote Sensing Data

Hyperspectral data offers a powerful tool for predicting soil heavy metal contamination due to its high spectral resolution and many continuous bands. However, band selection is the prerequisite to accurately invert and predict soil heavy metal concentration by hyperspectral data. In this paper, 181 soil samples were collected from the suburb of Nanjing City, and their reflectance spectra and s...

Least Angle Regression and LASSO for Large Datasets

Least-Angle Regression and the LASSO (l1-penalized regression) offer a number of advantages in variable selection applications over procedures such as stepwise or ridge regression, including prediction accuracy, stability and interpretability. We discuss formulations of these algorithms that extend to datasets in which the number of observations could be so large that it would not be possible t...

Discussion of “Least Angle Regression” by Efron

Algorithms for simultaneous shrinkage and selection in regression and classification provide attractive solutions to knotty old statistical challenges. Nevertheless, as far as we can tell, Tibshirani’s Lasso algorithm has had little impact on statistical practice. Two particular reasons for this may be the relative inefficiency of the original Lasso algorithm and the relative complexity of more...

Journal title:

Volume   Issue

Pages  -

Publication date: 2002